Multiple Kernel Sphere with Large Margin for Novelty Detection

Authors

  • X. M. Cheng
  • B. Zheng
  • W. J. Hu
Abstract

Novelty detection methods have been frequently applied in medical diagnosis, fault detection, network security, and the discovery of new species. Among them, Support Vector Data Description (SVDD) has received considerable attention for its comprehensive description ability, which covers the target data. Additionally, the Multiple Kernel Learning (MKL) technique has been extensively applied in machine learning methods, e.g., SVM classifiers and dimensionality reduction techniques. In this paper, we focus on the application of MKL to novelty detection (ND) and propose a new method, Multiple Kernel Sphere with Large Margin (MKSLM), for novelty detection. In the proposed method, the volume of the sphere is minimized while the margin between the surface of the sphere and the outliers is maximized, so as to obtain a sphere of minimum size that remains well separated from the outliers. An algorithm is also developed to solve the resulting optimization problem. Experimental results on various real data sets validate the superiority of the proposed method.
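The full MKSLM formulation and its solver are given in the paper itself. As a rough illustration of the kind of model it builds on, the sketch below fits a plain SVDD on a fixed convex combination of two kernels and scores new points by their kernel distance to the sphere's centre. It does not learn the kernel weights jointly and omits the paper's margin term for outliers; all function names and parameter values are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's MKSLM): a standard SVDD fit on a
# fixed convex combination of an RBF and a linear kernel. MKSLM additionally
# learns the kernel weights and maximizes the margin to labeled outliers.
import numpy as np
from scipy.optimize import minimize
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

def combined_kernel(X, Y, weights=(0.5, 0.5), gamma=0.1):
    """Fixed convex combination of base kernels (weights are not learned here)."""
    return weights[0] * rbf_kernel(X, Y, gamma=gamma) + weights[1] * linear_kernel(X, Y)

def fit_svdd(X, C=0.2, **kp):
    """Solve the SVDD dual: max_a a.K_diag - a.K.a  s.t.  sum(a) = 1, 0 <= a_i <= C.
    C must be at least 1/n_samples for the equality constraint to be feasible."""
    K = combined_kernel(X, X, **kp)
    n = K.shape[0]
    diag = np.diag(K)

    def neg_dual(a):                       # minimize the negative dual objective
        return -(a @ diag - a @ K @ a)

    res = minimize(neg_dual, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, C)] * n,
                   constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1.0}])
    alpha = res.x

    # Squared distances of training points to the implicit centre sum_i a_i phi(x_i);
    # the squared radius is read off a support vector with 0 < a_i < C (index 0 fallback).
    dist2 = diag - 2 * K @ alpha + alpha @ K @ alpha
    sv = int(np.argmax((alpha > 1e-6) & (alpha < C - 1e-6)))
    return alpha, dist2[sv]

def novelty_score(Z, X_train, alpha, R2, **kp):
    """Positive score: outside the sphere (novel); negative: inside (normal)."""
    Kzx = combined_kernel(Z, X_train, **kp)
    Kzz = np.diag(combined_kernel(Z, Z, **kp))
    Kxx = combined_kernel(X_train, X_train, **kp)
    dist2 = Kzz - 2 * Kzx @ alpha + alpha @ Kxx @ alpha
    return dist2 - R2
```

Trained on target-class data only, `fit_svdd` returns the dual weights and squared radius, and `novelty_score` flags points whose kernel distance to the centre exceeds the radius as novel.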


Similar Resources

Margin and Radius Based Multiple Kernel Learning

A serious drawback of kernel methods, and Support Vector Machines (SVM) in particular, is the difficulty in choosing a suitable kernel function for a given dataset. One of the approaches proposed to address this problem is Multiple Kernel Learning (MKL) in which several kernels are combined adaptively for a given dataset. Many of the existing MKL methods use the SVM objective function and try t...


Kernel PCA for novelty detection

Kernel principal component analysis (kernel PCA) is a non-linear extension of PCA. This study introduces and investigates the use of kernel PCA for novelty detection. Training data are mapped into an infinite-dimensional feature space. In this space, kernel PCA extracts the principal components of the data distribution. The squared distance to the corresponding principal subspace is the measure...
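That study measures novelty by the squared feature-space distance to the principal subspace learned by kernel PCA. As a quick, approximate stand-in that is easy to run, the hedged sketch below uses scikit-learn's KernelPCA and scores points by their input-space reconstruction error from the learned pre-image map; this only approximates the feature-space measure, and the kernel choice and parameter values are illustrative assumptions.

```python
# Approximate kernel-PCA novelty scoring via input-space reconstruction error.
# This is a proxy for the squared feature-space distance to the principal
# subspace described above, not an exact reimplementation of that measure.
import numpy as np
from sklearn.decomposition import KernelPCA

def kpca_novelty_scores(X_train, X_test, n_components=10, gamma=0.1):
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma,
                     fit_inverse_transform=True, alpha=1e-3)
    kpca.fit(X_train)                                   # fit on normal data only
    X_rec = kpca.inverse_transform(kpca.transform(X_test))
    return np.sum((X_test - X_rec) ** 2, axis=1)        # larger error => more novel
```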


Gini Support Vector Machine: Quadratic Entropy Based Robust Multi-Class Probability Regression

Many classification tasks require estimation of output class probabilities for use as confidence scores or for inference integrated with other models. Probability estimates derived from large margin classifiers such as support vector machines (SVMs) are often unreliable. We extend SVM large margin classification to GiniSVM maximum entropy multi-class probability regression. GiniSVM combines a q...


Ratio-Based Multiple Kernel Clustering

Maximum margin clustering (MMC) approaches extend the large margin principle of SVM to unsupervised learning with considerable success. In this work, we utilize the ratio between the margin and the intra-cluster variance, to explicitly consider both the separation and the compactness of the clusters in the objective. Moreover, we employ multiple kernel learning (MKL) to jointly learn the kernel...


A Linear Programming Approach to Novelty Detection

Novelty detection involves modeling the normal behaviour of a system, thereby enabling detection of any divergence from normality. It has potential applications in many areas, such as detection of machine damage or highlighting abnormal features in medical data. One approach is to build a hypothesis estimating the support of the normal data, i.e. constructing a function which is positive in the regi...



Journal title:

Volume   Issue

Pages  -

Publication date: 2017